
Performer-MPC: Navigation via real-time, on-robot transformers – Google AI Blog

#artificialintelligence

Despite decades of research, we don't see many mobile robots roaming our homes, offices, and streets. Real-world robot navigation in human-centric environments remains an unsolved problem. These settings demand safe, efficient movement through tight spaces: squeezing between coffee tables and couches, maneuvering through narrow corners, doorways, untidy rooms, and more. An equally critical requirement is navigating in a manner that complies with unwritten social norms around people, for example, yielding at blind corners or keeping a comfortable distance. Google Research is committed to examining how advances in ML may enable us to overcome these obstacles.


Alphabet Layoffs Hit Trash-Sorting Robots

WIRED

Teach a robot to open a door, and it ought to unlock a lifetime of opportunities. Just over a year after graduating from Alphabet's X moonshot lab, the team that trained over a hundred wheeled, one-armed robots to squeegee cafeteria tables, separate trash and recycling, and, yes, open doors is shutting down as part of budget cuts spreading across the Google parent, a spokeswoman confirmed. "Everyday Robots will no longer be a separate project within Alphabet," says Denise Gamboa, director of marketing and communications for Everyday Robots. "Some of the technology and part of the team will be consolidated into existing robotics efforts within Google Research." The robotics venture is the latest failed bet for X, which in the past decade also spun out internet-beaming balloons (Loon) and power-generating kites (Makani) before deeming them too commercially unviable to keep afloat.


Real-world AI assistant: Google combines a large language model with an everyday robot

#artificialintelligence

In the PaLM-SayCan project, Google is combining current robotics technology with advances in large language models. Advances in large-scale AI language models have so far mainly arrived in our digital lives, such as text translation, text and image generation, or behind the scenes, when tech platforms use language AI to moderate content. In the PaLM-SayCan project, various Google divisions are now combining the company's most advanced large language model to date with an everyday robot that could one day help in the home – an assistant for the real world. But that will take a while yet. Google unveiled the giant AI language model PaLM in early April, crediting the model with "breakthrough capabilities" in language understanding and, specifically, reasoning. PaLM stands for "Pathways Language Model" – making it a building block in Google's grand Pathways AI strategy for next-generation AI that can efficiently handle thousands or millions of tasks.


Alphabet is putting its prototype robots to work cleaning up around Google's offices

#artificialintelligence

What does Google's parent company Alphabet want with robots? Well, it would like them to clean up around the office, for a start. The company announced today that its Everyday Robots Project -- a team within its experimental X labs dedicated to creating "a general-purpose learning robot" -- has moved some of its prototype machines out of the lab and into Google's Bay Area campuses to carry out some light custodial tasks. "We are now operating a fleet of more than 100 robot prototypes that are autonomously performing a range of useful tasks around our offices," said Everyday Robots' chief robot officer Hans Peter Brøndmo in a blog post. "The same robot that sorts trash can now be equipped with a squeegee to wipe tables, and the same gripper that grasps cups can learn to open doors." The robots in question are essentially arms on wheels, with a multipurpose gripper on the end of a flexible arm attached to a central tower.


Inside X's Mission to Make Robots Boring

WIRED

These creatures are targeting tabletops. One of them will wheel up to a table and ponder for a few seconds to determine if people are seated; if so, it moves on until finding one that's empty. After lingering for a second--maybe taking the algorithmic equivalent of a deep breath before the "Let's do it" moment--the robot twirls and unfurls its limb, stretching the arm over the table to methodically cover the surface with a clear disinfectant. Then it withdraws the arm to squeeze out the excess fluid into a bucket on its base. Task completed, it moves on, seeking another table to swipe.


Alphabet X's new Everyday Robot project wants to build robots that can learn from the world around them

#artificialintelligence

Today, Alphabet's X moonshot division (formerly known as Google X) unveiled the Everyday Robot project, whose goal is to develop a "general-purpose learning robot." The idea is that its robots could use cameras and complex machine learning algorithms to see and learn from the world around them without needing to be coded for every individual movement. The team is testing robots that can help out in workplace environments, though right now, these early robots are focused on learning how to sort trash. One of them resembles a very tall, one-armed Wall-E (ironic, given what the robots are tasked to do). The concept of grasping something comes pretty easily to most humans, but it's a very challenging thing to teach a robot, and Everyday Robot's robots get their practice in both the physical world and the virtual world. In a tour of X's offices, Wired described how a "playpen" of nearly 30 of the robots (supervised by humans) spends its daytime hours sorting trash into trays for compost, landfill, and recycling.


Alphabet's Dream of an 'Everyday Robot' Is Just Out of Reach

#artificialintelligence

During a recent visit to Alphabet's X lab, I drained my coffee and left the compostable cup on a tray marked "Cans & Bottles." The transgression was soon mended. Twenty minutes later, a wheeled, one-armed, chest-high robot whirred along and inspected the cup with the 3D cameras inside its flattened head. Its arm reached out and used two sturdy yellow fingers to move the misplaced cup onto the adjacent green tray labeled "Compostables." The trash-literate robot--part of a project called Everyday Robot--has been in development for years, but X just began discussing it publicly.


The Public Access Weekly: Everyday robots

Engadget

Tomorrow is International Cosplay Day, so a quick shout-out to all you creative folks out there who spend so much time stitching together costumes to bring fantasy to life: Y'all are rad! It is very cool to see the myriad ways in which people bring their favorite fictional characters to life, and I have complete respect for the time, dedication and imagination that it takes to cosplay. If any of you are planning on participating, we would love to see pictures and hear about any events you attended! Additionally, and I really, REALLY shouldn't have to say this (or get shouty about it) but don't post things on Public Access if you haven't written them. We do not, ever, in any-freaking-way tolerate plagiarism -- and yes, rephrasing someone else's article word-for-word counts as plagiarism.